drivers: flash: fix RWW issues in flash_mcux_flexspi_hyperflash #88469


Conversation

danieldegrasse
Contributor

Apply a few fixes to the NXP FlexSPI hyperflash driver to prevent flash read access from the core while programming the flash. These fixes were tested with #85254 on an RT1050-EVK, as this is where I encountered the crashes.

Note that the ICACHE/DCACHE disable step is based on what is done here: https://github.com/nxp-mcuxpresso/mcux-sdk-examples/blob/52b428258efda7d5bd8a2ace2195ca828356743a/evkbimxrt1050/driver_examples/flexspi/hyper_flash/polling_transfer/flexspi_hyper_flash_ops.c#L279

Comment on lines 424 to 425
L1CACHE_DisableICache();
L1CACHE_DisableDCache();
Contributor


Shouldn't you use the sys_cache_* functions instead, in case CACHE_MANAGEMENT is turned off completely?

Contributor Author


Yeah, probably best to get this conversion done. Changed to use the sys_cache_* functions in the latest push.

butok previously approved these changes Apr 14, 2025
@danieldegrasse danieldegrasse changed the title drivers: flash: flash_mcux_flexspi_hyperflash drivers: flash: fix RWW issues in flash_mcux_flexspi_hyperflash Apr 14, 2025
@danieldegrasse danieldegrasse force-pushed the fix/rt1050-hyperflash branch from 7f23e09 to d3bcc61 Compare April 14, 2025 14:58
@de-nordic de-nordic assigned mmahadevan108 and unassigned de-nordic Apr 15, 2025
@de-nordic
Contributor

NXP specific, re-assigned to @mmahadevan108.

@de-nordic de-nordic removed their request for review April 24, 2025 18:26
@danieldegrasse
Contributor Author

@mmahadevan108 can you take a look?

Don't access the device pointer from critical sections when programming the
hyperflash, as this could cause an RWW hazard.

Signed-off-by: Daniel DeGrasse <ddegrasse@tenstorrent.com>
Disable the caches during erase and programming operations, as cache
pre-fetch operations can cause flash accesses outside of the application's
control.

Also, reduce the SCLK frequency used after erase operations to 200 MHz.
Without this, the RT1050 appears to hang after flash program operations.

Signed-off-by: Daniel DeGrasse <ddegrasse@tenstorrent.com>
@danieldegrasse
Contributor Author

@mmahadevan108 PTAL- have rebased to get a new CI run.

@danieldegrasse danieldegrasse force-pushed the fix/rt1050-hyperflash branch from d3bcc61 to c3971f0 Compare June 24, 2025 21:21

@dleach02 dleach02 self-assigned this Jun 26, 2025
@Raymond0225
Contributor

Did you guys notice that a similar fix is on another PR I submitted a week before?
#91853
Is there a way to avoid such duplicated work?

@danieldegrasse
Contributor Author

Did you guys notice that a similar fix is on another PR I submitted a week before?
#91853
Is there a way to avoid such duplicated work?

I usually go searching for PRs if I'm about to start fixing a driver; that said, I think this PR went somewhat unnoticed (partly because I haven't been pushing to get it merged).

@danieldegrasse
Contributor Author

Testing with #85254, it looks like #91853 has solved the issues I ran into on the RT1050. It might still be worth adding the sys_cache calls if we keep running into issues, but I'm closing this for now.

Labels
area: Flash platform: NXP Drivers NXP Semiconductors, drivers